Comment about AR spectral estimation

Usually an estimate is produced by computing the AR theoretical spectrum at (φ̂, σ̂²). With our Monte Carlo simulation approach, for every draw (φ, σ²), we can compute the spectrum and obtain a draw for f(ω). Typically the mean of these draws will be similar to the spectrum at (φ̂, σ̂²). With this posterior simulation, we have the possibility of computing quantiles, probability intervals or simply a band for the spectral density. The purpose of the band is to get an idea of the uncertainty of the estimation.
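The mapping from a posterior draw (φ, σ²) to a spectrum draw, and the resulting pointwise band, can be sketched in Python (numpy only; the AR(1) draws below are synthetic stand-ins for the phsim and sigma2 objects used later in the notes):

```python
import numpy as np

def ar_spectrum(phi, sigma2, freqs):
    # theoretical AR spectral density f(w) = sigma2 / (2*pi*|1 - sum_j phi_j e^{-ijw}|^2)
    phi = np.asarray(phi, dtype=float)
    j = np.arange(1, len(phi) + 1)
    tf = 1.0 - np.exp(-1j * np.outer(freqs, j)) @ phi
    return sigma2 / (2 * np.pi * np.abs(tf) ** 2)

# hypothetical posterior draws for an AR(1): phi_draws is M x 1, sigma2_draws is M
rng = np.random.default_rng(0)
M = 200
phi_draws = 0.5 + 0.02 * rng.standard_normal((M, 1))
sigma2_draws = 1.0 + 0.02 * rng.standard_normal(M)
freqs = np.linspace(0.01, np.pi, 128)

# one spectrum curve per posterior draw, then a pointwise 95% band and the mean curve
spectra = np.array([ar_spectrum(ph, s2, freqs)
                    for ph, s2 in zip(phi_draws, sigma2_draws)])
band_lo, band_hi = np.quantile(spectra, [0.025, 0.975], axis=0)
mean_curve = spectra.mean(axis=0)
```

Plotting band_lo and band_hi around mean_curve gives the uncertainty band described above.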
EEG example. The next figure shows spectrum curves for 50 draws of (φ, σ²). Recall that the object phsim has the draws of the φ coefficients and sigma2 the draws for the variance of the error term σ².
[Figure: 50 posterior samples of the AR(10) spectrum; spectrum versus frequency]
a = ar(eeg, order=10, aic=FALSE)
a$ar = as.vector(apply(phsim, 2, mean))
a$var = mean(sigma2)
x = spec.ar(a, n.freq=500, plot=FALSE)
plot(2*pi*x$freq, x$spec, type="l", axes=FALSE)
axis(1)
axis(2)
for(i in 1:50){
  a$ar = as.vector(phsim[i,])
  a$var = sigma2[i]
  x = spec.ar(a, n.freq=500, plot=FALSE)
  lines(2*pi*x$freq, x$spec)
  print(i)
}
Portmanteau lack of fit test

For this test we need to consider the estimated residuals for the AR model

ε̂_t = x_t − Σ_{j=1}^p φ̂_j x_{t−j}

where φ̂_j is some estimator of the model parameters. The purpose of this test is to determine whether the residuals are correlated. The null hypothesis is

H₀: ρ_1 = ρ_2 = ... = ρ_K = 0

The proposed test statistic is

Q = n(n + 2) Σ_{k=1}^K ρ̂_k² / (n − k)

where ρ̂_k is the sample ACF of the estimated residuals and K is a fixed integer.
The paper by Ljung and Box (1978), On a measure of lack of fit in time series models, Biometrika, 65, shows that under the null hypothesis Q approximately follows a chi-square distribution with K − (p + 1) degrees of freedom, or Q ~ χ²_{K−(p+1)}. The testing procedure is: reject the null hypothesis at the α level if Q > χ²_{K−(p+1)}(1 − α), where χ²_{K−(p+1)}(1 − α) is the (1 − α) quantile of the chi-square distribution with K − (p + 1) degrees of freedom. A problem with this test is that there is no formal rule to select the value of K.
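The statistic itself is simple to compute; a Python sketch (numpy only; variable names are illustrative, and the comparison of Q against the χ²_{K−(p+1)} quantile is left to a table or quantile function):

```python
import numpy as np

def ljung_box_Q(resid, K):
    # sample ACF of the residuals up to lag K
    r = np.asarray(resid, dtype=float)
    r = r - r.mean()
    n = len(r)
    denom = np.sum(r ** 2)
    rho = np.array([np.sum(r[k:] * r[:-k]) / denom for k in range(1, K + 1)])
    # Q = n(n+2) * sum_k rho_k^2 / (n - k)
    Q = n * (n + 2) * np.sum(rho ** 2 / (n - np.arange(1, K + 1)))
    return Q, rho

# for white-noise residuals Q should be unremarkable relative to chi-square(K - (p+1))
rng = np.random.default_rng(42)
Q, rho = ljung_box_Q(rng.standard_normal(500), K=20)
```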
A common approach is to compute the p-value of the test for different values of K.

fit = arima(eeg, order=c(10,0,0))
tsdiag(fit)
[Figure: tsdiag output — standardized residuals over time; ACF of residuals; p-values for the Ljung-Box statistic by lag]
Model order via likelihood approaches: AIC, BIC

We want to define a criterion that allows us to select the order p of an AR process. We are thinking of the AR model as a linear regression model with p covariates. As p increases, the likelihood (or log-likelihood) of the model evaluated at the MLE (φ̂, s²) also increases. However, as p increases we may have high autocorrelations among the regressors. A penalty function could be added to the likelihood function to compensate for more parameters in the model. A general selection criterion is to find the value of p such
that minimizes

−2 log[L(φ̂, s²)] + f(p)

where L(·) is the likelihood function of the regression model and f(·) is a penalty function. This penalty function f(p) is assumed to be an increasing function of p. Since we are working with a Normal linear model, we can show that

−2 log[L(φ̂, s²)] = m(log(2π) + 1) + m log(s²_p)

where m = n − p is the length of the response vector. In fact, for the AR model x = Fφ + ε, the likelihood
function is

L(φ, σ²) = (1/(2πσ²))^{m/2} exp(−(x − Fφ)′(x − Fφ)/(2σ²))

Recall that the MLE is φ̂ = (F′F)^{-1} F′x and s² = (x − Fφ̂)′(x − Fφ̂)/m, and so

L(φ̂, s²) = (1/(2πs²))^{m/2} exp(−m/2)

The first term of −2 log[L(φ̂, s²)] does not depend on p. The criterion reduces to finding the value of p for which

n log(s²_p) + f(p)

is minimum. The evaluation must be based on a common sample size. We
fix a maximum order p* and fit AR models for values of p ≤ p* based only on n* = n − p* observations. Then we compute

n* log(s²_p) + f(p),  p = 0, 1, ..., p*

and find the minimum over the range 0, 1, ..., p*. If we set f(p) = 2p, we have the Akaike information criterion (AIC). The AIC tends to give overestimated values of p. If we set f(p) = log(n*) p, we have the Bayesian information criterion (BIC). The BIC tends to give smaller values of p in comparison to AIC.
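The closed form −2 log[L(φ̂, s²)] = m(log(2π) + 1) + m log(s²_p) used above can be checked numerically on a synthetic Normal linear model (a Python sketch; F and x are simulated, not the EEG design):

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 200, 3
F = rng.standard_normal((m, p))
x = F @ np.array([0.5, -0.3, 0.2]) + rng.standard_normal(m)

# MLE under the Normal linear model x = F phi + e
phi_hat, *_ = np.linalg.lstsq(F, x, rcond=None)
resid = x - F @ phi_hat
s2 = resid @ resid / m

# -2 log L evaluated at the MLE, directly from the Normal likelihood
neg2loglik = m * np.log(2 * np.pi * s2) + (resid @ resid) / s2
# closed form from the notes: m(log(2*pi) + 1) + m*log(s2)
closed_form = m * (np.log(2 * np.pi) + 1) + m * np.log(s2)
```

The two quantities agree because (x − Fφ̂)′(x − Fφ̂)/s² = m by the definition of s².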
[Figure: log-likelihood criterion versus AR order]
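The order-selection recipe can be sketched in Python (least-squares fits stand in for the MLE; the AR(2) series and pmax = 10 are illustrative, not the EEG data):

```python
import numpy as np

def select_ar_order(x, pmax):
    # fit AR(p) for p = 0..pmax on a common sample of nstar = n - pmax
    # observations, then minimize nstar*log(s2_p) + f(p)
    x = np.asarray(x, dtype=float)
    n = len(x)
    nstar = n - pmax
    y = x[pmax:]                      # common response vector
    aic, bic = [], []
    for p in range(pmax + 1):
        if p == 0:
            s2 = np.mean(y ** 2)      # no regressors: residuals are y itself
        else:
            F = np.column_stack([x[pmax - j : n - j] for j in range(1, p + 1)])
            phi, *_ = np.linalg.lstsq(F, y, rcond=None)
            s2 = np.mean((y - F @ phi) ** 2)
        aic.append(nstar * np.log(s2) + 2 * p)            # f(p) = 2p
        bic.append(nstar * np.log(s2) + np.log(nstar) * p)  # f(p) = log(nstar) p
    return int(np.argmin(aic)), int(np.argmin(bic))

# synthetic AR(2) series: x_t = 1.5 x_{t-1} - 0.75 x_{t-2} + e_t
rng = np.random.default_rng(7)
x = np.zeros(600)
for t in range(2, 600):
    x[t] = 1.5 * x[t - 1] - 0.75 * x[t - 2] + rng.standard_normal()
p_aic, p_bic = select_ar_order(x[100:], pmax=10)
```

Because the BIC penalty grows faster in p, the AIC choice is never below the BIC choice.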
Forecasting with AR models

We will consider forecasting from both Bayesian and non-Bayesian perspectives. We wish to produce inference about the future. From time n, we wish to produce a statement about X_{n+1}, X_{n+2}, ..., X_{n+h}, where h is the forecasting horizon (how far we wish to predict in time). In a Bayesian setup, this translates into considering the predictive distribution for the future values,

p(x_{n+h}, x_{n+h−1}, ..., x_{n+1} | x_n, ..., x_1) = ∫ p(x_{n+h}, x_{n+h−1}, ..., x_{n+1} | x_n, ..., x_1, φ, σ²) p(φ, σ² | x_n, ..., x_1) dφ dσ²

For AR models, even with the non-informative prior
p(φ, σ²) ∝ 1/σ², this distribution does not have a recognizable form. However, using posterior simulation it is relatively simple to obtain samples of values for X_{n+1}, X_{n+2}, ..., X_{n+h}. We can proceed in the following way:

- Draw a pair (φ, σ²) from the Normal-Inverse Gamma distribution as we discussed before.
- Using this pair, draw a value x_{n+1} from a Normal distribution with mean Σ_{j=1}^p φ_j x_{n+1−j} and variance σ².
- Draw x_{n+2} from a Normal distribution with mean Σ_{j=1}^p φ_j x_{n+2−j} and variance σ². (In one of the terms of the autoregression we are using the draw for x_{n+1}.)
- Continue in this way until we generate a value for x_{n+h} from a Normal with mean Σ_{j=1}^p φ_j x_{n+h−j} and variance σ².
- Repeat all the steps until we obtain M samples of values x_{n+1}, x_{n+2}, ..., x_{n+h}.

An approximation to this scheme is to make draws from a predictive distribution which is conditional on an estimate of the model parameters (φ̂, σ̂²):

p(x_{n+h}, x_{n+h−1}, ..., x_{n+1} | φ̂, σ̂², x_n, x_{n−1}, ..., x_2, x_1)

We are treating (φ̂, σ̂²) as the true parameter. If the sample size n is large, this should produce similar results with respect to the full Bayesian approach that uses
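The first step, drawing (φ, σ²) from the Normal-Inverse Gamma posterior under the prior p(φ, σ²) ∝ 1/σ², can be sketched as follows (a Python sketch; the conditional-likelihood regression setup x = Fφ + ε from the earlier slides is assumed, and the function name is hypothetical):

```python
import numpy as np

def nig_posterior_draws(x, p, M, rng):
    # least-squares quantities for the conditional-likelihood AR(p) regression
    x = np.asarray(x, dtype=float)
    n = len(x)
    y = x[p:]
    F = np.column_stack([x[p - j : n - j] for j in range(1, p + 1)])
    m = len(y)
    FtF = F.T @ F
    phi_hat = np.linalg.solve(FtF, F.T @ y)
    ssr = float(np.sum((y - F @ phi_hat) ** 2))
    L = np.linalg.cholesky(np.linalg.inv(FtF))
    draws = []
    for _ in range(M):
        # sigma2 | x ~ ssr / chi-square(m - p)   (an inverse gamma draw)
        sigma2 = ssr / rng.chisquare(m - p)
        # phi | sigma2, x ~ N(phi_hat, sigma2 * (F'F)^{-1})
        phi = phi_hat + np.sqrt(sigma2) * (L @ rng.standard_normal(p))
        draws.append((phi, sigma2))
    return draws

# illustrative AR(1) data with phi = 0.6
rng = np.random.default_rng(3)
x = np.zeros(400)
for t in range(1, 400):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
draws = nig_posterior_draws(x, p=1, M=100, rng=rng)
```

Each element of draws can then seed one simulated future path, as in the steps above.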
draws of (φ, σ²). However, if the sample size is small we could find differences between the distributions. Once again, consider the EEG data with an AR(10) model. The figures show:

- Samples of predictive values and data.
- Comparison of the full predictive with the MLE predictive.
- Posterior mean of forecasts.
- Posterior mean and 95% predictive intervals for forecasts.

Parts of the code are included in file code6.s.
[Figure: EEG data and 4 samples from the predictive distribution]
[Figure: sample at the MLE compared to a sample from the predictive]
[Figure: EEG data and posterior mean for forecasts]
[Figure: posterior mean and 95% predictive intervals for the EEG forecasts over time]
# function to produce forecasts
# ph are the model coefficients, v is the innovation variance
# h is the forecasting horizon
# zt last p values of the time series (most recent first)
forcar = function(ph, v, h, zt)
{
  x = rep(NA, h); p = length(zt)
  for(i in 1:h)
  {
    x[i] = sum(ph*zt) + sqrt(v)*rnorm(1)
    zt[2:p] = zt[1:(p-1)]
    zt[1] = x[i]
  }
  return(x)
}

p = 10
zt = rev(eeg[(n-p+1):n])
forcar(phsim[10,], sigma2[10], 200, zt)
forcar(phhat, s, 200, zt)

# 500 samples and mean
fr = matrix(NA, 200, 500)
for(i in 1:500){
  fr[,i] = forcar(phsim[i,], sigma2[i], 200, zt)
}
meanfor = apply(fr, 1, mean)
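A Python analogue of forcar, together with the posterior mean and a pointwise 95% predictive band from many simulated paths (a sketch: fixed illustrative AR(2) parameters stand in for the posterior draws in phsim and sigma2; a full Bayesian version would use a different draw for each path):

```python
import numpy as np

def forcar(ph, v, h, zt, rng):
    # simulate h steps ahead from the AR recursion with parameters (ph, v);
    # zt holds the last p values of the series, most recent first
    ph = np.asarray(ph, dtype=float)
    zt = np.asarray(zt, dtype=float).copy()
    x = np.empty(h)
    for i in range(h):
        x[i] = ph @ zt + np.sqrt(v) * rng.standard_normal()
        zt = np.concatenate(([x[i]], zt[:-1]))   # shift the state forward
    return x

# predictive mean and pointwise 95% band from 500 simulated paths
rng = np.random.default_rng(11)
phi, v, h = np.array([0.7, -0.2]), 1.0, 50
zt = np.array([0.5, -0.1])
paths = np.array([forcar(phi, v, h, zt, rng) for _ in range(500)])
mean_forecast = paths.mean(axis=0)
lo, hi = np.quantile(paths, [0.025, 0.975], axis=0)
```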